
    Adaptive model selection method for a conditionally Gaussian semimartingale regression in continuous time

    This paper considers the problem of robust adaptive efficient estimation of a periodic function in a continuous-time regression model whose dependent noise is given by a general square-integrable semimartingale with a conditionally Gaussian distribution. Non-Gaussian Ornstein-Uhlenbeck-Lévy processes are an example of such noise. An adaptive model selection procedure based on improved weighted least squares estimates is proposed. Under some conditions on the noise distribution, a sharp oracle inequality for the robust risk is proved and the robust efficiency of the model selection procedure is established. Numerical results are also reported.
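
    As a rough illustration of this kind of procedure (not the paper's improved weighted least squares estimates or its penalty), the sketch below fits plain least squares projections of a noisy periodic signal on a trigonometric basis and selects the cut-off dimension with a simple Cp-type penalty; all function names and the penalty constant are illustrative choices.

```python
# Illustrative sketch only: projection estimates of a 1-periodic signal on a
# trigonometric basis, with a Cp-type penalized model selection over the cut-off.
import numpy as np

rng = np.random.default_rng(0)

def fourier_design(t, d):
    """Trigonometric basis 1, sqrt(2)cos(2*pi*k*t), sqrt(2)sin(2*pi*k*t), k = 1..d."""
    cols = [np.ones_like(t)]
    for k in range(1, d + 1):
        cols.append(np.sqrt(2) * np.cos(2 * np.pi * k * t))
        cols.append(np.sqrt(2) * np.sin(2 * np.pi * k * t))
    return np.column_stack(cols)

def projection_estimate(t, y, d):
    """Least squares fit on the first 2d+1 basis functions."""
    X = fourier_design(t, d)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef

def select_model(t, y, d_max, sigma2):
    """Pick the cut-off minimizing a Cp-type penalized residual sum of squares."""
    best = None
    for d in range(1, d_max + 1):
        fit = projection_estimate(t, y, d)
        crit = np.sum((y - fit) ** 2) + 2 * sigma2 * (2 * d + 1)
        if best is None or crit < best[0]:
            best = (crit, d, fit)
    return best[1], best[2]

# Synthetic data: a periodic signal observed on [0, 1) with additive noise.
n = 400
t = np.arange(n) / n
signal = np.sin(2 * np.pi * t) + 0.5 * np.cos(6 * np.pi * t)
y = signal + 0.3 * rng.standard_normal(n)
d_hat, fit = select_model(t, y, d_max=20, sigma2=0.3 ** 2)
print("selected cut-off:", d_hat, " empirical L2 error:", np.mean((fit - signal) ** 2))
```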

    Robust adaptive efficient estimation for a semi-Markov continuous time regression from discrete data

    In this article we consider the nonparametric robust estimation problem for continuous-time regression models with semi-Markov noise observed at discrete time moments. An adaptive model selection procedure is proposed, and a sharp non-asymptotic oracle inequality for the robust risks is obtained. We give sufficient conditions on the observation frequency under which the robust efficiency holds. It turns out that for semi-Markov models the robust minimax convergence rate may be faster or slower than the classical one.
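
    To make the observation scheme concrete, the toy sketch below simulates a continuous-time regression sampled at discrete moments t_j = j/p, with a piecewise-constant two-state noise whose sojourn times are non-exponential (Weibull) as a stand-in for a semi-Markov noise; the deterministic state alternation, the sojourn law and all names are simplifying assumptions, not the paper's model.

```python
# Hypothetical illustration of the observation scheme: y(t_j) = S(t_j) + xi(t_j),
# t_j = j/p, where xi is a toy semi-Markov noise (two states, Weibull sojourns).
import numpy as np

rng = np.random.default_rng(1)

def semi_markov_noise(horizon, levels=(-1.0, 1.0), shape=1.5, scale=0.2):
    """Piecewise-constant process: alternating states, Weibull sojourn times."""
    times, states = [0.0], [rng.choice(levels)]
    while times[-1] < horizon:
        times.append(times[-1] + scale * rng.weibull(shape))
        states.append(-states[-1])          # deterministic alternation for simplicity
    times, states = np.array(times), np.array(states)

    def xi(t):
        idx = np.searchsorted(times, t, side="right") - 1
        return states[idx]
    return xi

# Discrete observations of S(t) = sin(2*pi*t) corrupted by the semi-Markov noise.
p, horizon = 200, 5.0                       # p observations per unit of time
t_obs = np.arange(int(p * horizon)) / p
xi = semi_markov_noise(horizon + 1.0)
y_obs = np.sin(2 * np.pi * t_obs) + 0.5 * xi(t_obs)
print(y_obs[:5])
```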

    Sequential Model Selection Method for Nonparametric Autoregression

    In this paper the nonparametric autoregression estimation problem for quadratic risks is considered for the first time. To this end we develop a new adaptive sequential model selection method based on the efficient sequential kernel estimators proposed by Arkoun and Pergamenshchikov (2016). Moreover, we develop a new analytical tool for general regression models to obtain non-asymptotic sharp oracle inequalities for both the usual quadratic risk and the robust quadratic risk. We then show that the constructed sequential model selection procedure is optimal in the sense of these oracle inequalities.
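
    For orientation only, the sketch below shows a plain (non-sequential) Nadaraya-Watson kernel estimate for a generic nonparametric autoregression y_k = S(y_{k-1}) + eps_k; the sequential estimators of Arkoun and Pergamenshchikov and the model selection step proper are considerably more involved, and the model, bandwidth and names here are arbitrary choices.

```python
# Minimal non-sequential sketch: kernel regression of y_k on y_{k-1} to recover
# the autoregression function S in y_k = S(y_{k-1}) + eps_k.
import numpy as np

rng = np.random.default_rng(2)

def simulate_ar(n, S, noise_sd=0.3):
    """Simulate a stationary nonlinear autoregression of order one."""
    y = np.zeros(n)
    for k in range(1, n):
        y[k] = S(y[k - 1]) + noise_sd * rng.standard_normal()
    return y

def kernel_regression(x_query, x, y, h):
    """Gaussian-kernel Nadaraya-Watson estimate of E[y | x] at the query points."""
    w = np.exp(-0.5 * ((x_query[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / np.maximum(w.sum(axis=1), 1e-12)

S_true = lambda u: 0.6 * np.tanh(u)          # contracting map keeps the chain stable
y = simulate_ar(2000, S_true)
grid = np.linspace(-1.5, 1.5, 61)
S_hat = kernel_regression(grid, y[:-1], y[1:], h=0.15)
print("max abs error on grid:", np.max(np.abs(S_hat - S_true(grid))))
```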

    Approximate hedging problem with transaction costs in stochastic volatility markets

    This paper studies the problem of option replication in general stochastic volatility markets with transaction costs, using a new specification for the volatility adjustment in Leland's algorithm. We prove several limit theorems for the normalized replication error of Leland's strategy, as well as for the strategy suggested by Lépinette. The asymptotic results obtained not only generalize the existing results but also enable us to fix the under-hedging property pointed out by Kabanov and Safarian. We also discuss possible methods to improve the convergence rate and to reduce the option price inclusive of transaction costs.
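
    The sketch below illustrates the classical Leland scheme in the simplest possible setting: a constant-volatility Black-Scholes market with zero interest rate, a one-way proportional cost charged on every trade, and the enlarged variance sigma^2 * (1 + k*sqrt(8/pi)/(sigma*sqrt(dt))). It is not the stochastic-volatility specification studied in the paper, and the function and parameter names are mine.

```python
# Compact numerical sketch of Leland's discrete hedging with proportional
# transaction costs in a plain Black-Scholes market (zero interest rate).
import numpy as np
from math import erf, exp, log, pi, sqrt

rng = np.random.default_rng(3)
Phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))   # standard normal cdf

def call_price(s, k_strike, sigma, tau):
    """Black-Scholes call price with zero interest rate."""
    if tau <= 0:
        return max(s - k_strike, 0.0)
    d1 = (log(s / k_strike) + 0.5 * sigma ** 2 * tau) / (sigma * sqrt(tau))
    return s * Phi(d1) - k_strike * Phi(d1 - sigma * sqrt(tau))

def call_delta(s, k_strike, sigma, tau):
    if tau <= 0:
        return 1.0 if s > k_strike else 0.0
    d1 = (log(s / k_strike) + 0.5 * sigma ** 2 * tau) / (sigma * sqrt(tau))
    return Phi(d1)

def leland_replication_error(s0=100.0, k_strike=100.0, sigma=0.2, T=1.0,
                             n_steps=250, cost=0.002):
    dt = T / n_steps
    # Leland's enlarged volatility absorbs the expected transaction costs
    # (one-way proportional cost `cost` paid on every trade).
    sigma_adj = sigma * sqrt(1.0 + sqrt(8.0 / pi) * cost / (sigma * sqrt(dt)))
    s = s0
    c0 = call_price(s, k_strike, sigma_adj, T)       # option sold at the adjusted price
    delta = call_delta(s, k_strike, sigma_adj, T)
    cash = c0 - delta * s - cost * abs(delta) * s    # set up the initial hedge
    for j in range(1, n_steps + 1):
        s *= exp(-0.5 * sigma ** 2 * dt + sigma * sqrt(dt) * rng.standard_normal())
        new_delta = call_delta(s, k_strike, sigma_adj, T - j * dt)
        trade = new_delta - delta
        cash -= trade * s + cost * abs(trade) * s    # rebalance and pay costs
        delta = new_delta
    payoff = max(s - k_strike, 0.0)
    return delta * s + cash - payoff                 # terminal replication error

errors = np.array([leland_replication_error() for _ in range(200)])
print("mean / std of replication error:", errors.mean(), errors.std())
```

    Increasing n_steps in this toy experiment shrinks the spread of the terminal error, which is the kind of asymptotic behaviour the limit theorems in the paper quantify.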

    Adaptive efficient analysis for big data ergodic diffusion models

    We consider drift estimation problems for high-dimensional ergodic diffusion processes in a nonparametric setting, based on observations at fixed discrete time moments when the diffusion coefficients are unknown. To this end, using sequential analysis methods, we develop model selection procedures for which we show non-asymptotic sharp oracle inequalities. Through the obtained inequalities we show that the constructed model selection procedures are asymptotically efficient in the adaptive setting, i.e. when the model regularity is unknown. For the first time for such problems, we find in explicit form the celebrated Pinsker constant, which provides the sharp lower bound for the minimax squared accuracy normalized by the optimal convergence rate. We then show that the asymptotic quadratic risk of the model selection procedure coincides with the obtained lower bound, which means that the constructed procedure is efficient. Finally, on the basis of the constructed model selection procedures, in the framework of big data models we provide efficient estimation without using the parameter dimension or any sparsity conditions.
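
    As a minimal point of reference (far from the paper's high-dimensional, adaptive and efficiency-optimal procedure), the sketch below simulates a one-dimensional ergodic diffusion observed at equally spaced times and recovers its drift with a simple Nadaraya-Watson regression of the normalized increments; the drift, the bandwidth and every name in the code are illustrative assumptions.

```python
# Toy one-dimensional illustration: kernel drift estimation for an ergodic
# diffusion dX_t = S(X_t) dt + sigma dW_t from equally spaced observations.
import numpy as np

rng = np.random.default_rng(4)

def simulate_diffusion(S, sigma, x0, n, dt):
    """Euler-Maruyama path observed at times k*dt, k = 0..n."""
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        x[k + 1] = x[k] + S(x[k]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

def drift_estimate(grid, x, dt, h):
    """Kernel regression of the normalized increments (X_{k+1}-X_k)/dt on X_k."""
    incr = (x[1:] - x[:-1]) / dt
    w = np.exp(-0.5 * ((grid[:, None] - x[:-1][None, :]) / h) ** 2)
    return (w @ incr) / np.maximum(w.sum(axis=1), 1e-12)

S_true = lambda u: -u + np.sin(u)            # mean-reverting drift, hence ergodic
x = simulate_diffusion(S_true, sigma=1.0, x0=0.0, n=50_000, dt=0.02)
grid = np.linspace(-2.0, 2.0, 41)
S_hat = drift_estimate(grid, x, dt=0.02, h=0.2)
print("max abs error on grid:", np.max(np.abs(S_hat - S_true(grid))))
```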